Markov Chain
Definition of Markov Chain
1. (noun) A Markov process for which the parameter takes discrete time values.
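Because the definition is compact, a small simulation may help illustrate it: the sketch below walks a discrete-time chain in which the next state depends only on the current state, and the "parameter" is the discrete step count. The two-state weather model, its transition probabilities, and the simulate helper are hypothetical illustrations, not part of the dictionary entry.

import random

# Hypothetical two-state transition probabilities: from each current state,
# the chain moves to the next state with a fixed probability.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def simulate(start, steps):
    # Walk the chain for a given number of discrete time steps,
    # recording the state visited at each step.
    state, path = start, [start]
    for _ in range(steps):
        next_states, weights = zip(*transitions[state])
        state = random.choices(next_states, weights=weights)[0]
        path.append(state)
    return path

print(simulate("sunny", 10))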
Synonyms for the word "Markov chain"
Markoff chain
Words semantically linked with "Markov chain"
Markoff process
Markov process
Hypernyms for the word "Markov chain"
Markoff process
Markov process